intimate deepfake
One in four unconcerned by sexual deepfakes created without consent, survey finds
The report found 7% of respondents had been depicted in a sexual or intimate deepfake. One in four people think there is nothing wrong with creating and sharing sexual deepfakes, or feel neutral about it, even when the person depicted has not consented, according to a police-commissioned survey. The findings prompted a senior police officer to warn that the use of AI is accelerating an epidemic of violence against women and girls (VAWG), and that technology companies are complicit in the abuse. The survey of 1,700 people, commissioned by the office of the police chief scientific adviser, found 13% felt there was nothing wrong with creating and sharing sexual or intimate deepfakes - digitally altered content made using AI without consent.
- North America > United States (0.17)
- Europe > United Kingdom (0.16)
- Europe > Ukraine (0.07)
- Oceania > Australia (0.05)
The US Senate unanimously passes a bill to empower victims of intimate deepfakes
The US Senate unanimously passed a bill on Tuesday designed to hold accountable those who make or share deepfake porn. The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would allow victims to sue those who create, share or possess AI-generated sexual images or videos using their likeness. The issue took root in the public consciousness after the infamous Taylor Swift deepfakes that circulated among online lowlifes early this year. The bill would let victims sue for up to $150,000 in damages, rising to $250,000 if the offense involves attempted sexual assault, stalking or harassment.
- Information Technology > Security & Privacy (1.00)
- Government > Regional Government > North America Government > United States Government (0.90)
- Law > Criminal Law (0.76)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.62)